A Subspace-Projected Approximate Matrix Method for Systems of Linear Equations
Authors
Abstract
Given two n×n matrices A and A_0 and a sequence of subspaces {0} = V_0 ⊂ · · · ⊂ V_n = R^n with dim(V_k) = k, the k-th subspace-projected approximate matrix A_k is defined as A_k = A + Π_k(A_0 − A)Π_k, where Π_k is the orthogonal projection onto V_k^⊥. Consequently, A_k v = Av and v*A_k = v*A for all v ∈ V_k. Thus (A_k)_{k≥0} is a sequence of matrices that gradually changes from A_0 into A_n = A. In principle, the definition of V_{k+1} may depend on properties of A_k, which can be exploited to try to force A_{k+1} to be closer to A in some specific sense. By choosing A_0 as a simple approximation of A, this turns the subspace-projected approximate matrices into interesting preconditioners for linear algebra problems involving A. In the context of eigenvalue problems, they appeared in this role in Shepard et al. (2001), resulting in their Subspace Projected Approximate Matrix method. In this article, we investigate their use in solving linear systems of equations Ax = b. In particular, we seek conditions under which the solutions x_k of the approximate systems A_k x_k = b are computable at low computational cost, so that the efficiency of the corresponding method is competitive with existing methods such as the Conjugate Gradient and Minimal Residual methods. We also consider how well the sequence (x_k)_{k≥0} approximates the exact solution x, by performing some illustrative numerical tests.
AMS subject classifications: 65F10, 65F08
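The construction above can be sketched numerically. This is a minimal illustration, not the paper's method: the random A, the diagonal choice of A_0, and taking V_k as the span of the first k standard basis vectors are all assumptions made here for concreteness.

```python
import numpy as np

# Sketch of the k-th subspace-projected approximate matrix
# A_k = A + Pi_k (A_0 - A) Pi_k, where Pi_k projects onto V_k-perp.
# A, A_0, and V_k below are illustrative choices, not the paper's.
rng = np.random.default_rng(0)
n, k = 6, 3

A = rng.standard_normal((n, n))
A0 = np.diag(np.diag(A))           # a simple approximation of A (assumed here)

# V_k = span{e_1, ..., e_k}; Pi_k = orthogonal projector onto V_k-perp
V = np.eye(n)[:, :k]
Pi_k = np.eye(n) - V @ V.T

A_k = A + Pi_k @ (A0 - A) @ Pi_k

# For every v in V_k:  A_k v = A v  and  v* A_k = v* A
v = V @ rng.standard_normal(k)     # arbitrary vector in V_k
print(np.allclose(A_k @ v, A @ v))   # True: Pi_k annihilates v
print(np.allclose(v @ A_k, v @ A))   # True: Pi_k is symmetric
```

Since Π_k v = 0 for v ∈ V_k, the correction term Π_k(A_0 − A)Π_k vanishes on V_k from both sides, which is exactly what the two checks confirm.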
Similar resources
Preconditioned Generalized Minimal Residual Method for Solving Fractional Advection-Diffusion Equation
Fractional differential equations (FDEs) have attracted much attention and have been widely used in finance, physics, image processing, and biology. It is not always possible to find an analytical solution for such equations, so an approximate solution or numerical scheme may be a good approach; in particular, schemes from numerical linear algebra for solving ...
Using an Efficient Penalty Method for Solving Linear Least Square Problem with Nonlinear Constraints
In this paper, we use a penalty method for solving the linear least squares problem with nonlinear constraints. In each iteration of penalty methods for solving this problem, the calculation of the projected Hessian matrix is required. Since the objective function is linear least squares, the projected Hessian matrix of the penalty function consists of two parts, the exact amount of a part of i...
Approximate solution of system of nonlinear Volterra integro-differential equations by using Bernstein collocation method
This paper presents a numerical matrix method based on Bernstein polynomials (BPs) for approximating the solution of a system of m-th order nonlinear Volterra integro-differential equations under initial conditions. The approach is based on operational matrices of BPs. Using the collocation points, this approach reduces the system of Volterra integro-differential equations associated with the giv...
Projected Equation Methods for Approximate Solution of Large Linear Systems
We consider linear systems of equations and solution approximations derived by projection on a low-dimensional subspace. We propose stochastic iterative algorithms, based on simulation, which converge to the approximate solution and are suitable for very large-dimensional problems. The algorithms are extensions of recent approximate dynamic programming methods, known as temporal difference metho...
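The projection idea behind that approach can be sketched deterministically. The cited paper builds stochastic, simulation-based iterations that converge to the projected solution; the snippet below simply forms the Galerkin-projected system directly, with a randomly chosen basis Phi as a stand-in assumption.

```python
import numpy as np

# Deterministic sketch of approximating A x = b by Galerkin projection
# onto a low-dimensional subspace span(Phi): solve the small system
# Phi^T A Phi y = Phi^T b and take x_hat = Phi y.  (The referenced paper
# reaches this projected solution via stochastic iterations instead.)
rng = np.random.default_rng(1)
n, s = 50, 5

A = np.eye(n) + 0.1 * rng.standard_normal((n, n))   # well-conditioned system
b = rng.standard_normal(n)
Phi = rng.standard_normal((n, s))                    # basis of the subspace

y = np.linalg.solve(Phi.T @ A @ Phi, Phi.T @ b)      # small s-by-s solve
x_hat = Phi @ y

# Galerkin condition: the residual is orthogonal to the subspace
print(np.allclose(Phi.T @ (b - A @ x_hat), 0))       # True
```

The check confirms the defining property of the projected solution: the residual b − A x̂ has no component in span(Phi).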
On the convergence of the homotopy analysis method to solve the system of partial differential equations
One of the efficient and powerful schemes for solving linear and nonlinear equations is the homotopy analysis method (HAM). In this work, we obtain the approximate solution of a system of partial differential equations (PDEs) by means of HAM. For this purpose, we develop the concept of HAM for a system of PDEs in matrix form. Then, we prove the convergence theorem and apply the proposed method to fi...